AIbase

# Small parameter size

## Qwenphi 4 0.5b Draft
- Author: rdsm
- License: Apache-2.0
- Description: Built upon Qwen2.5-0.5B-Instruct, with the vocabulary transplanted from microsoft/phi-4, it can be used as a draft model for Phi-4.
- Tags: Large Language Model, Transformers, Supports Multiple Languages
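The "draft model" role mentioned above refers to speculative (assisted) decoding: a small, fast model proposes several tokens, and the large target model verifies them in a single pass, so agreed-upon tokens cost almost nothing. Below is a minimal toy sketch of the greedy accept/verify loop; the two "models" are deterministic stand-in functions, not the actual Qwen/Phi-4 pairing.

```python
def draft_next(prefix):
    # Toy stand-in for the small draft model: greedy next token.
    return len(prefix) % 3

def target_next(prefix):
    # Toy stand-in for the large target model whose output must be reproduced.
    return len(prefix) % 2

def speculative_step(prefix, k=4):
    """Draft proposes k tokens; the target verifies them.

    Tokens where draft and target agree are accepted; at the first
    disagreement the target's own token is used and the step ends.
    The result is identical to running the target alone, token by token."""
    proposals = []
    p = list(prefix)
    for _ in range(k):
        t = draft_next(p)
        proposals.append(t)
        p.append(t)
    accepted = list(prefix)
    for t in proposals:
        correct = target_next(accepted)
        if t == correct:
            accepted.append(t)        # draft guessed right: token is "free"
        else:
            accepted.append(correct)  # mismatch: take the target's token, stop
            break
    return accepted
```

With real models the verification pass is a single batched forward call of the target over all proposed tokens, which is where the speedup comes from; the accept/reject logic is the same.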
## Vda Fine Tuned 2
- Author: calogero-jerik-scozzaro
- Description: A fine-tuned version of GroNLP/gpt2-small-italian, suitable for Italian text generation tasks.
- Tags: Large Language Model, Transformers
## Electra Small Turkish Uncased Discriminator Finetuned Lr 2e 05 Epochs 3
- Author: husnu
- Description: A small Turkish discriminator based on the ELECTRA architecture, fine-tuned on the Turkish SQuAD dataset.
- Tags: Question Answering System, Transformers
## Gpt2 Finnish
- Author: Finnish-NLP
- License: Apache-2.0
- Description: A Finnish language model pre-trained using the GPT-2 architecture; 117M-parameter version.
- Tags: Large Language Model, Other
© 2025 AIbase